Fast Kernel Classifier Construction Using Orthogonal Forward Selection to Minimise Leave-One-Out Misclassification Rate

Authors

  • Xia Hong
  • Sheng Chen
  • Christopher J. Harris
Abstract

We propose a simple yet computationally efficient construction algorithm for two-class kernel classifiers. In order to optimise the classifier’s generalisation capability, an orthogonal forward selection procedure is used to select kernels one by one by minimising the leave-one-out (LOO) misclassification rate directly. It is shown that the computation of the LOO misclassification rate is very efficient owing to orthogonalisation. Examples are used to demonstrate that the proposed algorithm is a viable alternative for constructing sparse two-class kernel classifiers in terms of performance and computational efficiency.
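
As a rough illustration of the procedure described above, the sketch below performs greedy forward selection of candidate kernel columns using the leave-one-out misclassification count as the selection score. It is a deliberately simplified reading of the abstract, not the authors' algorithm: labels are assumed to be coded as ±1, the classifier is a plain least-squares fit on the selected columns, the LOO predictions use the standard leverage identity y_i - e_i/(1 - h_ii) instead of the paper's fast orthogonal recursions, and the names gaussian_kernel_matrix, loo_error and ofs_loo are introduced here for illustration only.

```python
import numpy as np

def gaussian_kernel_matrix(X, width=1.0):
    # One candidate Gaussian kernel column centred on each training sample.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def loo_error(P, y):
    # LOO misclassification count of a least-squares fit on the columns of P.
    theta, *_ = np.linalg.lstsq(P, y, rcond=None)
    residual = y - P @ theta
    # Leverages h_ii of the hat matrix P (P^T P)^+ P^T.
    h = np.clip(np.diag(P @ np.linalg.pinv(P.T @ P) @ P.T), 0.0, 1.0 - 1e-9)
    loo_pred = y - residual / (1.0 - h)   # LOO prediction for every sample
    return int(np.sum(np.sign(loo_pred) != y))

def ofs_loo(K, y, max_terms=10):
    # Greedy forward selection: at each step add the kernel column giving the
    # lowest LOO error, and stop once the LOO error no longer decreases.
    selected, best = [], np.inf
    for _ in range(max_terms):
        scores = [loo_error(K[:, selected + [j]], y) if j not in selected else np.inf
                  for j in range(K.shape[1])]
        j_best = int(np.argmin(scores))
        if scores[j_best] >= best:
            break
        selected.append(j_best)
        best = scores[j_best]
    return selected, best
```

A call such as selected, err = ofs_loo(gaussian_kernel_matrix(X), y) would return the indices of the chosen kernel centres and the final LOO error count; the paper's algorithm obtains this kind of result far more cheaply by orthogonalising the selected columns incrementally.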

Similar articles

A fast linear-in-the-parameters classifier construction algorithm using orthogonal forward selection to minimize leave-one-out misclassification rate

International Journal of Systems Science; X. Hong, S. Chen, C. J. Harris. Publication details, including instructions for authors and subscription information: http://www.informaworld.com/smpp/title~content=t713697751

Construction of RBF Classifiers with Tunable Units Using Orthogonal Forward Selection Based on Leave-one-out Misclassification Rate [IJCNN1219]

An orthogonal forward selection (OFS) algorithm based on the leave-one-out (LOO) misclassification rate is proposed for the construction of radial basis function (RBF) classifiers with tunable units. Each stage of the construction process determines an RBF unit, namely its centre vector and diagonal covariance matrix as well as its weight, by minimising the LOO statistics. This OFS-LOO algorithm is comp...

Orthogonal Forward Regression based on Directly Maximizing Model Generalization Capability

The paper introduces a construction algorithm for sparse kernel modelling using the leave-one-out test score, also known as the PRESS (Predicted REsidual Sums of Squares) statistic. An efficient subset model selection procedure is developed in the orthogonal forward regression framework by incrementally maximizing the model's generalization capability to construct sparse models with good generaliz...
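
The selection score named in this snippet, the PRESS statistic, is the sum of squared leave-one-out residuals, and the point of the orthogonal framework is that it can be evaluated without n separate refits. A minimal sketch under the same assumptions as the earlier block (plain least squares, the leverage identity e_i/(1 - h_ii), and a hypothetical function name press_statistic):

```python
import numpy as np

def press_statistic(P, y):
    # PRESS = sum of squared leave-one-out residuals of a least-squares fit
    # on the columns of P, obtained in one pass from the ordinary residuals
    # via the identity e_i^(-i) = e_i / (1 - h_ii).
    theta, *_ = np.linalg.lstsq(P, y, rcond=None)
    e = y - P @ theta
    h = np.clip(np.diag(P @ np.linalg.pinv(P.T @ P) @ P.T), 0.0, 1.0 - 1e-9)
    return float(np.sum((e / (1.0 - h)) ** 2))
```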

PDFOS: PDF estimation based over-sampling for imbalanced two-class problems

This contribution proposes a novel probability density function (PDF) estimation-based over-sampling (PDFOS) approach for two-class imbalanced classification problems. The classical Parzen-window kernel function is adopted to estimate the PDF of the positive class. Then, according to the estimated PDF, synthetic instances are generated as additional training data. The essential concept is to...
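
For the over-sampling step this snippet describes, the sketch below draws synthetic positive-class samples from a Gaussian Parzen-window density estimate: an existing positive sample is picked as the kernel centre and perturbed with the window's spread. The isotropic bandwidth and the function name parzen_oversample are assumptions made here; the paper itself fits the smoothing parameters to the data.

```python
import numpy as np

def parzen_oversample(X_pos, n_new, bandwidth=0.5, seed=None):
    # Sample from a Gaussian Parzen-window estimate of the positive-class PDF:
    # choose a kernel centre uniformly among the positive samples, then add
    # Gaussian noise with the window's (isotropic) standard deviation.
    rng = np.random.default_rng(seed)
    centres = X_pos[rng.integers(0, len(X_pos), size=n_new)]
    return centres + rng.normal(scale=bandwidth, size=centres.shape)
```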

A radial basis function network classifier to maximise leave-one-out mutual information

We develop an orthogonal forward selection (OFS) approach to construct radial basis function (RBF) network classifiers for two-class problems. Our approach integrates several concepts in probabilistic modelling, including cross validation, mutual information and Bayesian hyperparameter fitting. At each stage of the OFS procedure, one model term is selected by maximising the leave-one-out mutual...


Publication date: 2006